# 1024-dimensional vector space
## Persian Sentence Embedding V3
A sentence transformer fine-tuned from xlm-roberta-large that maps text into a 1024-dimensional vector space and supports multilingual semantic-understanding tasks.
- Task: Text Embedding
- Author: Msobhi · 751 downloads · 2 likes
## E5 Large Korean
License: MIT. A Korean sentence embedding model fine-tuned from multilingual-e5-large that maps text into a 1024-dimensional vector space, suitable for tasks such as semantic-similarity calculation.
- Task: Text Embedding
- Tags: Transformers · Multilingual
- Author: upskyy · 2,222 downloads · 2 likes
## Sentence Transformers Multilingual E5 Large
A multilingual sentence embedding model built on sentence-transformers that maps text into a 1024-dimensional vector space, suitable for semantic search and clustering tasks.
- Task: Text Embedding
- Author: embaas · 53.70k downloads · 2 likes
## Polish Sts V2
A Polish sentence embedding model that maps sentences and paragraphs into a 1024-dimensional vector space, suitable for semantic search and clustering tasks.
- Task: Text Embedding
- Tags: Transformers · Other
- Author: radlab · 43 downloads · 2 likes
## Gbert Large Paraphrase Cosine
License: MIT. A German text embedding model built on the sentence-transformers framework that maps text into a 1024-dimensional vector space, designed specifically to improve few-shot text classification performance in German.
- Task: Text Embedding
- Tags: Transformers · German
- Author: deutsche-telekom · 21.03k downloads · 23 likes
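All of the models listed above embed text into a 1024-dimensional vector space, where semantic similarity between two texts is typically scored by the cosine similarity of their embeddings. A minimal sketch with NumPy (the vectors below are random stand-ins, not outputs of any of the models listed):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Stand-ins for 1024-dimensional sentence embeddings.
rng = np.random.default_rng(0)
emb_a = rng.standard_normal(1024)
emb_b = rng.standard_normal(1024)

score = cosine_similarity(emb_a, emb_b)
assert -1.0 <= score <= 1.0
print(f"cosine similarity: {score:.4f}")
```

In practice the embeddings would come from one of the models above (e.g. via the sentence-transformers library's `encode` method); the similarity computation itself is the same regardless of which model produced the vectors.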